Ernesto Carrella
Data is limited and heterogeneous: some of it may be qualitative, and some may be of low quality.
We cannot use a simple model, either because of the policy question or because of the nature of the data.
We want to:
Calibrate 8 parameters with this
Mission impossible!
Rejection filtering to the rescue.
You can always:
"Most likely" isn’t well defined when data is limited.
Minimizing an error will mechanically produce a parameter, but it ignores that parameter’s uncertainty.
Must not look for the single "best" parameter set (since it’s meaningless)
List instead all the good enough ones
Do this with rejection filtering
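The idea above can be sketched in a few lines. This is a minimal, illustrative toy (the model, priors, and acceptance range are all made up for the example): draw candidate parameters from priors, run the model, and keep every candidate whose output is "good enough" rather than searching for a single optimum.

```python
import random

def toy_model(params):
    # Stand-in for an expensive simulation: maps parameters to one summary output.
    a, b = params
    return a * 10 + b

def passes_filters(output):
    # "Good enough" means the output falls inside a plausible range.
    return 5.0 <= output <= 15.0

def rejection_filter(n_samples=10_000, seed=0):
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        # Draw a candidate parameter set from (uniform) priors.
        params = (rng.uniform(0, 2), rng.uniform(-5, 5))
        # Keep it only if the simulated output passes every filter.
        if passes_filters(toy_model(params)):
            accepted.append(params)
    return accepted

kept = rejection_filter()
# 'kept' is the whole set of acceptable parameter vectors, not a point estimate,
# so it carries the parameter uncertainty with it.
```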
We have built a bio-economic model of the world.
An application to agent-based modeling
Data and model degradation
Do we get the same insights?
In lieu of Indonesia
| Filter | Definition |
|---|---|
| F1 | Landings (trolling and purse seine combined) have never exceeded 15,000 t |
| F2 | Landings of the Usuki trolling fleet are currently between 250 t and 1,850 t |
| F3 | Current spawning potential ratio (SPR) is between 10% and 25% |
| F4 | The current Usuki trolling fishery is comprised of fewer than 60 vessels |
| F5 | The current Usuki trolling fishery landings are 30% or less of the total landings (by weight) |
| F6 | Fishing on the stock by the Usuki troll fishery was initiated 30-45 years before the current time. |
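Filters F1–F6 amount to boolean predicates over a simulation’s summary outputs; a run is accepted only if every predicate holds. A sketch, with illustrative field names (the real model’s outputs are not named in this deck):

```python
# Each filter checks one summary statistic of a finished simulation run.
def passes_all_filters(sim):
    checks = [
        sim["peak_total_landings_t"] <= 15_000,                      # F1
        250 <= sim["troll_landings_t"] <= 1_850,                     # F2
        0.10 <= sim["spr"] <= 0.25,                                  # F3
        sim["troll_vessels"] < 60,                                   # F4
        sim["troll_landings_t"] <= 0.30 * sim["total_landings_t"],   # F5
        30 <= sim["years_since_troll_start"] <= 45,                  # F6
    ]
    return all(checks)

# A hypothetical run that satisfies every filter:
example_run = {
    "peak_total_landings_t": 12_000,
    "troll_landings_t": 900,
    "spr": 0.18,
    "troll_vessels": 45,
    "total_landings_t": 4_000,
    "years_since_troll_start": 35,
}
```

Only runs for which `passes_all_filters` returns `True` survive; the 22-dimensional parameter vectors behind those runs form the "good enough" set.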
The model has 22 parameters
The ABM is certainly wrong:
Can we still discover anything meaningful?
The collider bias was uncovered by the rejection sampling itself; no need to draw a DAG
The DAG is in the model in the computer
Rejection sampling exploits the causal mechanisms without ever needing to draw a DAG
Limited applicability (Goldilocks principle)
Lack of a stepping stone
What do we validate on, now?